This is an 8-bit quantized version of the LiquidAI/LFM2-8B-A1B model, converted for the Apple MLX framework. LFM2-8B-A1B is a mixture-of-experts model with 8B total parameters, of which only a small fraction are active per token (as the A1B suffix indicates), and it supports multilingual text generation.
Tags: Natural Language Processing, MLX, Multiple Languages
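
The model can be run with the `mlx-lm` package (`pip install mlx-lm`). Below is a minimal sketch; the repo id is a placeholder, so substitute the actual path of this quantized model.

```python
from mlx_lm import load, generate

# Placeholder repo id — replace with this repository's actual path
# (or a local directory containing the converted 8-bit MLX weights).
model, tokenizer = load("your-username/LFM2-8B-A1B-8bit-mlx")

prompt = "Explain mixture-of-experts models in one paragraph."

# If the tokenizer ships a chat template, wrap the prompt in it first.
if tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True
    )

# Generate a completion; verbose=True streams tokens to stdout.
response = generate(model, tokenizer, prompt=prompt, verbose=True)
print(response)
```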